feat: add real-time gaze ingestion and WebSocket streaming per session#45

Open
Avinash-Alapati wants to merge 3 commits into ruxailab:main from Avinash-Alapati:feature/realtime-gaze-streaming

Conversation

@Avinash-Alapati

What this adds

  • REST endpoint POST /api/session/gaze to ingest live gaze points
  • WebSocket broadcasting using Flask-SocketIO
  • Session-based rooms so multiple observers can subscribe to the same session
  • Real-time push of gaze points to connected observers (see the sketch after this list)
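
A minimal sketch of how these pieces could fit together, assuming Flask-SocketIO on the backend; the payload fields (`session_id`, `x`, `y`, `timestamp`), the `gaze_point` event name, and the `join_session` handler are illustrative assumptions, not the PR's exact schema:

```python
# Sketch: REST ingestion + per-session Socket.IO room broadcast (assumed schema).
from flask import Flask, request, jsonify
from flask_socketio import SocketIO, join_room

app = Flask(__name__)
socketio = SocketIO(app, cors_allowed_origins="*")

@app.route("/api/session/gaze", methods=["POST"])
def ingest_gaze():
    point = request.get_json()
    # Broadcast the point to every observer subscribed to this session's room.
    socketio.emit("gaze_point", point, to=point["session_id"])
    return jsonify({"status": "ok"})

@socketio.on("join_session")
def on_join(data):
    # Observers join a room named after the session they want to watch.
    join_room(data["session_id"])

if __name__ == "__main__":
    socketio.run(app, port=5000)
```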

Why this is needed

Currently the API supports calibration and batch prediction, but there is no support for streaming gaze data during live usability sessions.
This change adds the backend infrastructure required for real-time gaze visualization and future session replay, which is a core part of the GSoC project scope.

Scope

  • Non-breaking, additive change
  • Does not modify existing calibration or prediction routes
  • Uses in-memory buffering for now, to be replaced with persistent storage during full RUXAILAB integration (a buffering sketch follows this list)
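
A minimal sketch of the interim in-memory buffering, assuming a bounded per-session deque (the names and cap are illustrative assumptions):

```python
# Sketch: per-session in-memory gaze buffer (interim, pre-persistence).
from collections import defaultdict, deque

MAX_POINTS = 10_000  # assumed cap to bound memory per session

gaze_buffers = defaultdict(lambda: deque(maxlen=MAX_POINTS))

def buffer_gaze_point(session_id: str, point: dict) -> None:
    # Oldest points drop automatically once the cap is reached.
    gaze_buffers[session_id].append(point)
```

A bounded deque keeps memory predictable during long sessions until persistent storage lands.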

Demo

Short demo showing REST → WebSocket live streaming using a test observer client:

Real-time.Gaze.Streaming.Demo.mp4
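
For reference, a test observer client along these lines can be sketched with the python-socketio client library; the `join_session`/`gaze_point` event names mirror the server sketch above and are assumptions:

```python
# Sketch: test observer client subscribing to one session's gaze stream.
import socketio

sio = socketio.Client()

@sio.event
def connect():
    # Subscribe to a session room right after connecting (assumed event name).
    sio.emit("join_session", {"session_id": "demo-session"})

@sio.on("gaze_point")
def on_gaze_point(point):
    print("gaze:", point)

sio.connect("http://localhost:5000")
sio.wait()  # keep receiving pushed gaze points
```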

Next steps (planned)

  • Connect eye-tracking frontend to stream gaze during sessions
  • Integrate observer visualization in RUXAILAB UI
  • Persist gaze timeline for replay and analysis

@Avinash-Alapati changed the title from "Feature/realtime gaze streaming" to "feat: add real-time gaze ingestion and WebSocket streaming per session" on Jan 26, 2026
@Avinash-Alapati (Author) commented Jan 26, 2026

Hi @KarinePistili @jvJUCA, I've opened a PR adding session-based real-time gaze ingestion and WebSocket streaming (non-breaking, backend only).
It covers the core streaming infrastructure needed for live observation in RUXAILAB.
Whenever you have time, I’d really appreciate your feedback on the API design and whether this aligns with the intended session pipeline.
PR: #45

@Avinash-Alapati (Author) commented:

Added a follow-up commit to integrate live gaze streaming from the calibration flow.

Now the frontend periodically posts gaze points to POST /api/session/gaze during training/validation, and the backend logs them and broadcasts them via per-session Socket.IO rooms.

This validates the full REST → WebSocket pipeline end-to-end with the actual eye-tracking flow (not just a mock client).
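
For anyone reproducing this without the frontend, a rough Python stand-in for the periodic posting loop (the interval, payload fields, and URL are assumptions):

```python
# Sketch: simulate the calibration frontend posting gaze points at ~10 Hz.
import time
import random
import requests

URL = "http://localhost:5000/api/session/gaze"

while True:
    point = {
        "session_id": "demo-session",
        "x": random.random(),      # normalized screen coordinates (assumed)
        "y": random.random(),
        "timestamp": time.time(),
    }
    requests.post(URL, json=point, timeout=2)
    time.sleep(0.1)  # assumed ~10 Hz posting rate
```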
